AI is in Apple’s newest products, just not on the lips of its executives on Tuesday.

What AI Technologies Were Featured at WWDC?

Since the release of ChatGPT last November, generative AI has become a mainstream phenomenon, comparable to Pandora opening her infamous box. Startups and industry leaders alike are now racing to incorporate the technology into their existing code stacks and fold the transformative potential of machine-generated content into their apps. In the current hype cycle, touting your genAI accomplishments is essential to standing out amid the flood of customizable chatbots and self-writing PowerPoint decks already on the market.

If Google’s latest I/O conference or Meta’s new dedicated development team were any indication, the tech industry’s biggest players are gearing up to go all in on genAI, too. Google’s event focused on the company’s AI ambitions for Bard and PaLM 2, perhaps even at the expense of the hardware it announced, including the Pixel Fold and 7a phones and the Pixel Tablet. From Gmail’s Smart Compose to the camera’s Real Tone and Magic Editor, from Project Tailwind to the 7a’s creative wallpapers, AI was first on the lips of every Alphabet executive who took the Shoreline stage.

However, if you drank two fingers every time Apple said “AI” during its WWDC 2023 keynote, you’d have walked away sober.

Zero. That’s how many times a presenter on stage uttered the phrase “artificial intelligence” at WWDC 2023. The closest anyone came was “Air,” while the term “machine learning” was said exactly seven times.

That doesn’t mean Apple isn’t investing heavily in AI research and development. The products on display during Tuesday’s keynote were packed with the technology. The improved autocorrect features are powered by on-device machine learning, as are the Lock Screen’s live photos (which use it to synthesize extra frames) and the new Journal app’s inspiring, personalized writing prompts. The PDF autofill features rely on machine vision to understand which fields go where, the Health app’s new myopia screening does the same with your child’s screen distance, and AirPods now tailor playback settings to your preferences and environmental conditions. All of it runs on machine learning systems.

Apple just didn’t talk about it. At least not directly.

Even when discussing the cutting-edge features of the new Vision Pro headset, whether the natural language processing behind its audio inputs, its audio ray tracing, the black magic of its machine vision, or whatever real-time hand-gesture tracking and Optic ID require, the conversation stayed focused on what those features can do for users. Not on what the headset could do for the state of the art, or for Apple’s standing in the race for market superiority.

The closest Apple came during the event to openly describing the digital nuts and bolts of its machine learning systems was in its explanation of Vision Pro’s Persona feature. With the device’s apps skewed heavily toward gaming, entertainment and communication, there was never a chance we’d get through this without having to take FaceTime calls while wearing one. And since a FaceTime call with everyone hidden behind a headset would defeat the purpose of a video call, Apple instead uses a complex machine learning system to digitally recreate the Vision Pro user’s head, torso, arms and hands, otherwise known as a “Persona.”

“After a quick enrollment process using Vision Pro’s front sensors, the system uses an advanced encoder-decoder neural network to create your digital Persona,” Mike Rockwell, who heads Apple’s Technology Development Group, said during the event. “Trained on a diverse group of thousands of individuals, this network delivers a natural representation that dynamically responds to your facial and hand movements.”
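For the curious, an encoder-decoder network is a two-part model: the encoder compresses its input into a compact latent representation, and the decoder reconstructs an output from that code. Apple hasn’t published how Persona actually works, so the snippet below is only a minimal, hypothetical PyTorch sketch of the general idea, with invented layer sizes, in which a stand-in for a face scan is distilled into a latent code and decoded back into a reconstruction.

```python
# Illustrative only: a generic encoder-decoder sketch, NOT Apple's Persona model.
# All dimensions and layer choices here are invented for demonstration.
import torch
import torch.nn as nn


class EncoderDecoder(nn.Module):
    def __init__(self, input_dim=1024, latent_dim=64):
        super().__init__()
        # Encoder: compress a flattened sensor capture into a small latent code.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256),
            nn.ReLU(),
            nn.Linear(256, latent_dim),
        )
        # Decoder: reconstruct an output representation from the latent code.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256),
            nn.ReLU(),
            nn.Linear(256, input_dim),
        )

    def forward(self, x):
        latent = self.encoder(x)      # a quick "enrollment" distills the input into a code
        return self.decoder(latent)   # the code is expanded back into a reconstructed likeness


if __name__ == "__main__":
    model = EncoderDecoder()
    fake_capture = torch.randn(1, 1024)   # stand-in for a flattened face scan
    reconstruction = model(fake_capture)
    print(reconstruction.shape)           # torch.Size([1, 1024])
```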

The fact that artificial intelligence was treated as an afterthought throughout the event, rather than a selling point, works greatly in Apple’s favor. Standing apart from the carnival-like atmosphere that currently surrounds generative AI development not only maintains Apple’s aloof, premium branding, it also distances the company from Google’s aggressive push into the technology and helps ease curious shoppers toward its face-mounted hardware.

Steve Jobs often used the phrase “it just works” to describe the company’s products, meaning they were meant to solve problems, not create extra hassle for users, and Apple appears to have rekindled that design philosophy for the dawn of the spatial computing age. In our increasingly dysfunctional and unstable society, the promise of simplicity and reliability, of something, anything, working as advertised, could be just what Apple needs to get buyers to swallow the Vision Pro’s $3,499 asking price.
